Graph Neural Networks: Architectures, Stability, and Transferability

Authors

Abstract

Graph neural networks (GNNs) are information processing architectures for signals supported on graphs. They are presented here as generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters instead of banks of classical convolutional filters. Otherwise, GNNs operate as CNNs do: filters are composed with pointwise nonlinearities and stacked in layers. It is shown that GNN architectures exhibit equivariance to permutations and stability to graph deformations. These properties help explain the good performance of GNNs that can be observed empirically. It is also shown that if graphs converge to a limit object, a graphon, GNNs converge to a corresponding limit object, a graphon neural network. This convergence justifies the transferability of GNNs across networks with different numbers of nodes. Concepts are illustrated by the application of GNNs to recommendation systems, decentralized collaborative control, and wireless communication networks.
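The layer structure described above (a bank of graph filters followed by a pointwise nonlinearity) can be sketched in a few lines. The snippet below is a minimal NumPy illustration, not the authors' implementation: it assumes a polynomial graph filter of order K applied to a graph shift operator S, with filter taps H learned elsewhere. The test following it checks the permutation equivariance property the abstract mentions.

```python
import numpy as np

def gnn_layer(S, X, H, sigma=np.tanh):
    """One GNN layer: a bank of polynomial graph filters
    followed by a pointwise nonlinearity.

    S : (n, n) graph shift operator (e.g. adjacency matrix)
    X : (n, f_in) graph signal, f_in features per node
    H : (K, f_in, f_out) filter taps, K = filter order
    """
    Z = np.zeros((X.shape[0], H.shape[2]))
    Sk_X = X                      # S^0 X
    for k in range(H.shape[0]):
        Z += Sk_X @ H[k]          # accumulate tap k: S^k X H_k
        Sk_X = S @ Sk_X           # advance to S^(k+1) X
    return sigma(Z)               # pointwise nonlinearity
```

Because the filters are polynomials in S and the nonlinearity acts node-wise, relabeling the nodes of the graph permutes the output in the same way, which is the permutation equivariance property.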


Similar resources

Rodbar Dam Slope Stability Analysis Using Neural Networks

In this research, an artificial neural network is presented for predicting the safety-factor values and the critical safety factor of inhomogeneous earth dams, taking into account the effect of earthquake inertial forces. The model inputs include dam height and upstream slope angle, earthquake coefficient, water height, and the strength parameters of the core and shell; its outputs include the safety factor. The most important quantity of interest in slope stability analysis is obtaining the safety factor. In this research ...

Neural Networks and Graph Transformations

The introduction of the artificial neuron by McCulloch and Pitts, who were inspired by the biological neuron, is considered the beginning of the field of artificial neural networks. Since then, many new networks and new algorithms for neural networks have been invented. In most textbooks on (artificial) neural networks there is no general definition of what a neural net is, but r...


Nonlinear neural networks: Principles, mechanisms, and architectures

An historical discussion is provided of the intellectual trends that caused nineteenth century interdisciplinary studies of physics and psychobiology by leading scientists such as Helmholtz, Maxwell, and Mach to splinter into separate twentieth-century scientific movements. The nonlinear, nonstationary, and nonlocal nature of behavioral and brain data are emphasized. Three sources of contempora...


Neural Networks: Algorithms and Special Architectures

The paper is focused on neural networks, their learning algorithms, special architectures, and SVMs. A general learning rule as a function of the incoming signals is discussed. Other learning rules, such as Hebbian learning, delta learning, perceptron learning, Least Mean Square (LMS) learning, and Winner Take All (WTA) learning, are presented as derivations of the general learning rule. Architecture spe...


Deriving Neural Architectures from Sequence and Graph Kernels

The design of neural architectures for structured objects is typically guided by experimental insights rather than a formal process. In this work, we appeal to kernels over combinatorial structures, such as sequences and graphs, to derive appropriate neural operations. We introduce a class of deep recurrent neural operations and formally characterize their associated kernel spaces. Our recurren...



Journal

Journal title: Proceedings of the IEEE

Year: 2021

ISSN: 1558-2256, 0018-9219

DOI: https://doi.org/10.1109/jproc.2021.3055400